Guaranteed Sparse Recovery under Linear Transformation

Authors

  • Ji Liu
  • Lei Yuan
  • Jieping Ye
Abstract

We consider the following signal recovery problem: given a measurement matrix Φ ∈ R^{n×p} and a noisy observation vector c ∈ R^n constructed from c = Φθ∗ + ε, where ε ∈ R^n is a noise vector whose entries follow an i.i.d. centered sub-Gaussian distribution, how can we recover the signal θ∗ if Dθ∗ is sparse under a linear transformation D ∈ R^{m×p}? One natural method using convex optimization is to solve the problem min_θ (1/2)‖Φθ − c‖² + λ‖Dθ‖₁. This paper provides an upper bound on the estimation error and shows the consistency property of this method, assuming that the design matrix Φ is a Gaussian random matrix. Specifically, we show that 1) in the noiseless case, if the condition number of D is bounded and the number of measurements satisfies n ≥ Ω(s log p), where s is the sparsity number, then the true solution can be recovered with high probability; and 2) in the noisy case, if the condition number of D is bounded and the number of measurements grows faster than s log p, that is, s log p = o(n), then the estimation error converges to zero with probability 1 as p and s go to infinity. Our results are consistent with those for the special case D = I_{p×p} (equivalently, LASSO) and improve the existing analysis. The condition number of D plays a critical role in our analysis. We consider the condition number in two cases, the fused LASSO and the random graph: the condition number in the fused LASSO case is bounded by a constant, while the condition number in the random graph case is bounded with high probability if m/p (i.e., #edges/#vertices) is larger than a certain constant. Numerical simulations are consistent with our theoretical results.
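To make the estimator concrete, below is a minimal sketch of one standard way to solve min_θ (1/2)‖Φθ − c‖² + λ‖Dθ‖₁, namely ADMM with the splitting z = Dθ. The solver choice, the fixed penalty parameter rho, the iteration count, and the variable names (Phi, c, D, lam) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Elementwise soft-thresholding: the proximal operator of kappa * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def generalized_lasso_admm(Phi, c, D, lam, rho=1.0, n_iters=500):
    """Minimize 0.5 * ||Phi @ theta - c||^2 + lam * ||D @ theta||_1 via ADMM,
    using the splitting z = D @ theta (illustrative sketch with a fixed step count)."""
    p, m = Phi.shape[1], D.shape[0]
    theta, z, u = np.zeros(p), np.zeros(m), np.zeros(m)
    # The theta-update solves the same linear system matrix every iteration.
    A = Phi.T @ Phi + rho * D.T @ D
    Phi_t_c = Phi.T @ c
    for _ in range(n_iters):
        theta = np.linalg.solve(A, Phi_t_c + rho * D.T @ (z - u))   # quadratic subproblem
        z = soft_threshold(D @ theta + u, lam / rho)                # l1 subproblem
        u = u + D @ theta - z                                       # dual ascent step
    return theta
```

For the fused LASSO case discussed in the abstract, D would be the (p−1)×p first-order difference matrix, e.g. D = np.eye(p - 1, p, k=1) - np.eye(p - 1, p); taking D to be the identity recovers the ordinary LASSO.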


Similar articles

Sparse Recovery using Smoothed ℓ0 (SL0): Convergence Analysis

Finding the sparse solution of an underdetermined system of linear equations has many applications; in particular, it is used in Compressed Sensing (CS), Sparse Component Analysis (SCA), and sparse decomposition of signals on overcomplete dictionaries. We have recently proposed a fast algorithm, called Smoothed ℓ0 (SL0), for this task. Contrary to many other sparse recovery algorithms, SL0 is not b...
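As a rough illustration of the SL0 idea described above (replace the ℓ0 "norm" by a smooth Gaussian surrogate whose width σ is gradually decreased, alternating small gradient steps with projection back onto the feasible set {s : As = x}), here is a short numpy sketch. The parameter values, the function name sl0, and the stopping rule are assumptions for illustration, not the authors' reference implementation.

```python
import numpy as np

def sl0(A, x, sigma_min=1e-3, sigma_decay=0.5, mu=2.0, inner_iters=3):
    """Sketch of Smoothed-l0 (SL0) for an underdetermined system A s = x.

    Maximizes sum(exp(-s_i^2 / (2 sigma^2))) over the affine set {s : A s = x},
    gradually shrinking sigma so the smooth surrogate approaches the l0 'norm'."""
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                      # minimum-l2-norm feasible starting point
    sigma = 2.0 * np.max(np.abs(s))
    while sigma > sigma_min:
        for _ in range(inner_iters):
            s = s - mu * s * np.exp(-s ** 2 / (2.0 * sigma ** 2))  # gradient step on the surrogate
            s = s - A_pinv @ (A @ s - x)                           # project back onto {s : A s = x}
        sigma *= sigma_decay
    return s
```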

Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

The low-rank matrix recovery (LMR) problem is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification, and control. This class of optimization problems is NP-hard, and a popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we ...
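As one concrete instance of the nuclear-norm relaxation mentioned above, the sketch below runs singular value thresholding (SVT)-style iterations for matrix completion, where the linear equality constraints fix the observed entries. The step size delta, threshold tau, function name, and iteration count are illustrative assumptions, and SVT is one particular solver for the relaxation rather than this paper's own method.

```python
import numpy as np

def svt_complete(M_obs, mask, tau=5.0, delta=1.2, n_iters=200):
    """Sketch of singular value thresholding for matrix completion.

    M_obs : observed matrix (arbitrary values at unobserved positions)
    mask  : boolean array, True where an entry of the underlying matrix is observed
    Iterates X = shrink(Y, tau) and Y += delta * P_Omega(M_obs - X), where shrink
    soft-thresholds the singular values (the proximal operator of the nuclear norm)."""
    Y = np.zeros_like(M_obs, dtype=float)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt          # singular value soft-thresholding
        Y = Y + delta * mask * (M_obs - X)               # gradient step on the observed entries
    return X
```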

S-semigoodness for Low-Rank Semidefinite Matrix Recovery

We extend and characterize the concept of s-semigoodness for a sensing matrix in sparse nonnegative recovery (proposed by Juditsky, Karzan and Nemirovski [Math Program, 2011]) to linear transformations in low-rank semidefinite matrix recovery. We show that s-semigoodness is not only a necessary and sufficient condition for exact s-rank semidefinite matrix recovery by a semidefinite program,...

The Restricted Isometry Property and ℓp Sparse Recovery Failure

This paper considers conditions based on the restricted isometry constant (RIC) under which the solution of an underdetermined linear system with minimal ℓp norm, 0 < p ≤ 1, is guaranteed to be also the sparsest one. Specifically, matrices are identified that have RIC δ_{2m} arbitrarily close to 1/√2 ≈ 0.707, for which sparse recovery with p = 1 fails for at least one m-sparse vector. This indicates ...
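For reference, the restricted isometry constant δ_k of a matrix A is the smallest δ such that (1 − δ)‖x‖² ≤ ‖Ax‖² ≤ (1 + δ)‖x‖² holds for every k-sparse x. The brute-force sketch below makes this quantity concrete for tiny matrices by enumerating all supports of size k; it is purely didactic (exact RIC computation is intractable at realistic sizes), and the function name and example sizes are assumptions.

```python
import numpy as np
from itertools import combinations

def restricted_isometry_constant(A, k):
    """Exact-by-enumeration RIC delta_k of A: the smallest delta such that
    (1 - delta)||x||^2 <= ||A x||^2 <= (1 + delta)||x||^2 for every k-sparse x.
    Enumerates all supports of size k, so only usable for very small matrices."""
    _, p = A.shape
    delta = 0.0
    for support in combinations(range(p), k):
        s = np.linalg.svd(A[:, list(support)], compute_uv=False)
        # Largest deviation of the squared singular values from 1 on this support.
        delta = max(delta, abs(s[0] ** 2 - 1.0), abs(1.0 - s[-1] ** 2))
    return delta

# Example: a small Gaussian matrix whose columns have unit norm in expectation.
A = np.random.randn(20, 30) / np.sqrt(20)
print(restricted_isometry_constant(A, 2))
```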

Restricted Isometry Constants where ℓp sparse recovery can fail for 0 < p ≤ 1

This paper investigates conditions under which the solution of an underdetermined linear system with minimal ℓp norm, 0 < p ≤ 1, is guaranteed to be also the sparsest one. Matrices are constructed with restricted isometry constants (RIC) δ_{2m} arbitrarily close to 1/√2 ≈ 0.707, where sparse recovery with p = 1 fails for at least one m-sparse vector, as well as matrices with δ_{2m} arbitrarily close ...

RESTRICTED ISOMETRY CONSTANTS WHERE ℓp SPARSE RECOVERY CAN FAIL FOR 0 < p ≤ 1

We investigate conditions under which the solution of an underdetermined linear system with minimal ℓp norm, 0 < p ≤ 1, is guaranteed to be also the sparsest one. Our results highlight the pessimistic nature of sparse recovery analysis when recovery is predicted based on the restricted isometry constants (RIC) of the associated matrix. We construct matrices with RIC δ_{2m} arbitrarily close to 1/ ...

Journal title: Proceedings of the 30th International Conference on Machine Learning (ICML), Atlanta, Georgia, USA; JMLR: W&CP volume 28

Publication date: 2013